Performance bounds for particle filters using the optimal proposal
Particle filters may suffer from degeneracy of the particle weights. For the simplest "bootstrap" filter, it is known that avoiding degeneracy in large systems requires the ensemble size to increase exponentially with the variance of the observation log-likelihood. The present article shows, first, that a similar result applies to particle filters using sequential importance sampling and the optimal proposal distribution and, second, that the optimal proposal yields minimal degeneracy when compared to any other proposal distribution that depends only on the previous state and the most recent observations. Thus, the optimal proposal provides performance bounds for filters using sequential importance sampling and any such proposal. An example with independent and identically distributed degrees of freedom illustrates both the need for an exponentially large ensemble size with the optimal proposal as the system dimension increases and the potentially dramatic advantages of the optimal proposal relative to simpler proposals. Those advantages depend crucially on the magnitude of the system noise.
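The weight degeneracy the abstract describes is easy to reproduce in a toy model. The following sketch uses a hypothetical linear-Gaussian setup (not the paper's example) and measures the effective sample size of a single bootstrap-filter update as the number of i.i.d. dimensions grows:

```python
import numpy as np

def bootstrap_ess(dim, n_particles=1000, obs_var=1.0):
    """Effective sample size of one bootstrap-filter update in `dim`
    i.i.d. dimensions (toy linear-Gaussian model, not the paper's)."""
    rng = np.random.default_rng(0)
    # Prior particles and one observation y = x + noise per dimension.
    particles = rng.standard_normal((n_particles, dim))
    y = rng.standard_normal(dim) * np.sqrt(1.0 + obs_var)
    # Log-likelihood of y under each particle; its variance grows with dim.
    log_w = -0.5 * np.sum((y - particles) ** 2, axis=1) / obs_var
    log_w -= log_w.max()              # stabilise the exponential
    w = np.exp(log_w)
    w /= w.sum()
    return 1.0 / np.sum(w ** 2)       # Kong's effective sample size

for d in (1, 5, 25, 100):
    print(d, round(bootstrap_ess(d), 1))
```

As the dimension (and hence the variance of the observation log-likelihood) grows, the effective sample size collapses toward a single particle, which is the degeneracy that forces the exponential growth in ensemble size.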
Bosonic Massless Higher Spin Fields from Matrix Model
We study matrix models as a new approach to formulate massless higher spin
gauge field theory. As a first step in this direction, we show that the free
equation of motion of bosonic massless higher spin gauge fields can be derived
from that of a matrix model.

Comment: 19 pages, no figures, one reference added
TarTar: A Timed Automata Repair Tool
We present TarTar, an automatic repair analysis tool that, given a timed
diagnostic trace (TDT) obtained during the model checking of a timed automaton
model, suggests possible syntactic repairs of the analyzed model. The suggested
repairs include modified values for clock bounds in location invariants and
transition guards, adding or removing clock resets, etc. The proposed repairs
are guaranteed to eliminate executability of the given TDT, while preserving
the overall functional behavior of the system. We give insights into the design
and architecture of TarTar, and show that it can successfully repair 69% of the
seeded errors in system models taken from a diverse suite of case studies.

Comment: 15 pages, 7 figures
Does eye examination order for standard automated perimetry matter?
PURPOSE: In spite of faster examination procedures, visual field (VF) results are potentially influenced by fatigue. We use large-scale VF data collected from clinics to test the hypothesis that perimetric fatigue effects are greater in the eye examined second. METHODS: Series of six Humphrey Swedish Interactive Testing Algorithm (SITA) VFs from 6901 patients were retrospectively extracted from a VF database from four different glaucoma clinics. Mean deviation (MD) was compared between first and second tested eyes. A surrogate measure of longitudinal MD variability over time was estimated from the residual errors of a linear regression of MD against time, and then compared between first and second tested eyes. RESULTS: The right eye VF was consistently tested first in 6320 (91.6%) patients. Median (interquartile range; IQR) MD in the first tested (right) eye and second tested (left) eye was -2.57 (-6.15, -0.58) dB and -2.70 (-6.34, -0.80) dB respectively (median reduction in VF sensitivity of 0.13 dB; p < 0.001). The median (IQR) increase in our surrogate measure of longitudinal MD variability in the second tested eye was 3% (-43%, 50%); this effect was not associated with patient age or rest time between examinations. CONCLUSION: Statistically significant perimetric fatigue effects manifest on average in the second eye tested in routine clinics using Humphrey Field Analyzer SITA examinations. However, the average effects were very small and there was enormous variation among patients. We recommend starting with a right eye examination so that any perimetric fatigue effects, if they exist in an individual, will be as constant as possible from visit to visit.
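The surrogate variability measure described in the abstract can be sketched as follows. The visit data below are hypothetical and `md_variability` is an illustrative name, not the study's actual pipeline:

```python
import numpy as np

def md_variability(times, md_values):
    """Surrogate longitudinal variability: root-mean-square residual of an
    ordinary least-squares fit of mean deviation (MD, dB) against time."""
    slope, intercept = np.polyfit(times, md_values, deg=1)
    residuals = md_values - (slope * times + intercept)
    return float(np.sqrt(np.mean(residuals ** 2)))

# Hypothetical series of six visits (years) for the two eyes of one patient.
t = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5])
md_first = np.array([-2.5, -2.6, -2.4, -2.7, -2.6, -2.8])   # right eye
md_second = np.array([-2.7, -3.1, -2.5, -3.0, -2.6, -3.2])  # left eye

rel_increase = md_variability(t, md_second) / md_variability(t, md_first) - 1
print(f"variability increase in second eye: {rel_increase:.0%}")
```

A series that follows its regression line exactly has zero surrogate variability, so the measure isolates visit-to-visit scatter from any genuine linear MD trend.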
Extreme value modelling of storm damage in Swedish forests
Forests cover about 56% of the land area in Sweden, and forest damage due to strong winds has been a recurring problem. In this paper we analyse recorded storm damage in Swedish forests for the years 1965-2007. During this period, 48 individual storm events with a total damage of 164 Mm³ were reported, with the severe storm of 8 to 9 January 2005 the worst at 70 Mm³ of damaged forest. For the analysis, the storm damage data have been normalised to account for the increase in total forest volume over the period. We show that, within the framework of statistical extreme value theory, a Poisson point process model can be used to describe these storm damage events. The damage data support a heavy-tailed distribution with great variability in damage for the worst storm events. According to the model, and in view of the available data, the return period for a storm with damage of the size of the severe storm of January 2005 is approximately 80 years, i.e. a storm with damage of this magnitude will happen, on average, once every eighty years. To investigate a possible temporal trend, models with time-dependent parameters have been analysed but give no conclusive evidence of an increasing trend in the normalised storm damage data for the period. Using a non-parametric approach with a kernel-based local-likelihood method gives the same result.
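The kind of extreme value calculation the abstract describes can be sketched with a peaks-over-threshold fit, which for the tail is equivalent to the Poisson point process view. The exceedances below are synthetic and illustrative; the paper's normalised Swedish damage data are not reproduced here:

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)

# Illustrative storm-damage exceedances (Mm^3 above a threshold u).
u = 1.0                                   # threshold (Mm^3), assumed
excesses = genpareto.rvs(c=0.4, scale=3.0, size=48, random_state=rng)

# Fit a generalized Pareto distribution to the excesses over the threshold.
c_hat, _, scale_hat = genpareto.fit(excesses, floc=0.0)

# Return period of damage level x: 1 / (rate * P(excess > x - u)),
# where `rate` is the number of threshold-exceeding storms per year.
rate = 48 / 43.0                          # 48 events in 1965-2007
x = 70.0                                  # the January 2005 storm (Mm^3)
p_exceed = genpareto.sf(x - u, c_hat, loc=0.0, scale=scale_hat)
return_period = 1.0 / (rate * p_exceed)
print(f"estimated return period: {return_period:.0f} years")
```

A positive fitted shape parameter corresponds to the heavy-tailed behaviour the abstract reports; the resulting return period is then finite but large, in the spirit of the paper's roughly 80-year estimate.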
A trick for passing degenerate points in Ashtekar formulation
We examine one of the advantages of Ashtekar's formulation of general
relativity: a tractability of degenerate points from the point of view of
following the dynamics of classical spacetime. Assuming that all dynamical
variables are finite, we conclude that an essential trick for such a continuous
evolution is in complexifying variables. In order to restrict the complex
region locally, we propose some `reality recovering' conditions on spacetime.
Using a degenerate solution derived by pull-back technique, and integrating the
dynamical equations numerically, we show that this idea works in an actual
dynamical problem. We also discuss some features of these applications.

Comment: 9 pages in RevTeX or 16 pages in LaTeX; 3 EPS figures and the epsf style file are included
Squashing Models for Optical Measurements in Quantum Communication
Measurements with photodetectors necessarily need to be described in the
infinite dimensional Fock space of one or several modes. For some measurements
a model has been postulated which describes the full mode measurement as a
composition of a mapping (squashing) of the signal into a small dimensional
Hilbert space followed by a specified target measurement. We present a
formalism to investigate whether a given measurement pair of mode and target
measurements can be connected by a squashing model. We show that the
measurements used in the BB84 protocol do admit a squashing description,
whereas the six-state protocol does not. As a result, security proofs for the
BB84 protocol can be based on the assumption that the eavesdropper forwards at
most one photon, while the same does not hold for the six-state protocol.

Comment: 4 pages, 2 figures. Fixed a typographical error. Replaced the six-state protocol counter-example. Conclusions of the paper are unchanged
Information geometry of density matrices and state estimation
Given a pure state vector |x> and a density matrix rho, the function
p(x|rho) = &lt;x|rho|x&gt; defines a probability density on the space of pure states
parameterised by density matrices. The associated Fisher-Rao information
measure is used to define a unitary invariant Riemannian metric on the space of
density matrices. An alternative derivation of the metric, based on square-root
density matrices and trace norms, is provided. This is applied to the problem
of quantum-state estimation. In the simplest case of unitary parameter
estimation, new higher-order corrections to the uncertainty relations,
applicable to general mixed states, are derived.

Comment: published version
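The probability density the abstract assigns to pure states is straightforward to evaluate numerically. A minimal single-qubit sketch, with an illustrative density matrix that is not from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

def density_probability(x, rho):
    """p(x|rho) = <x|rho|x>: the probability the density matrix rho
    assigns to the normalised pure state vector |x>."""
    return float(np.real(np.conj(x) @ rho @ x))

# An illustrative qubit density matrix (Hermitian, trace 1, positive)
# and a random normalised pure state |x>.
rho = np.array([[0.7, 0.1 - 0.2j],
                [0.1 + 0.2j, 0.3]])
x = rng.standard_normal(2) + 1j * rng.standard_normal(2)
x /= np.linalg.norm(x)

p = density_probability(x, rho)
# Sanity check: on the eigenstates of rho the density recovers its eigenvalues.
evals, evecs = np.linalg.eigh(rho)
print(p, [density_probability(evecs[:, k], rho) for k in range(2)])
```

Because rho is Hermitian with unit trace, p(x|rho) always lies between the smallest and largest eigenvalues of rho, which is what makes it usable as a density for Fisher-Rao style constructions.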
The reality conditions for the new canonical variables of General Relativity
We examine the constraints and the reality conditions that have to be imposed
in the canonical theory of 4--d gravity formulated in terms of Ashtekar
variables. We find that the polynomial reality conditions are consistent with
the constraints, and make the theory equivalent to Einstein's, as long as the
inverse metric is not degenerate; when it is degenerate, reality conditions
cannot be consistently imposed in general, and the theory describes complex
general relativity.

Comment: 11
A Note on trapped Surfaces in the Vaidya Solution
The Vaidya solution describes the gravitational collapse of a finite shell of
incoherent radiation falling into flat spacetime and giving rise to a
Schwarzschild black hole. There has been a question whether closed trapped
surfaces can extend into the flat region (whereas closed outer trapped surfaces
certainly can). For the special case of self-similar collapse we show that the
answer is yes, if and only if the mass function rises fast enough.

Comment: 14 pages, 4 figures; minor polish added to version